Maximum conditional likelihood


Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm

Jebara, Tony, Pentland, Alex

Neural Information Processing Systems

We present the CEM (Conditional Expectation Maximization) algorithm as an extension of the EM (Expectation Maximization) algorithm to conditional density estimation under missing data. A bounding and maximization process is given to specifically optimize conditional likelihood instead of the usual joint likelihood. We apply the method to conditioned mixture models and use bounding techniques to derive the model's update rules. Monotonic convergence, computational efficiency and regression results superior to EM are demonstrated.
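
For readers scanning this entry, the distinction the abstract draws can be written out explicitly. The sketch below uses generic notation (pairs (x_i, y_i), parameters theta) that is assumed for illustration and is not taken from the paper: the joint log-likelihood that standard EM maximizes versus the conditional log-likelihood that CEM targets.

    % Illustrative notation only; symbols are not the paper's.
    \ell_{\text{joint}}(\theta) = \sum_{i=1}^{N} \log p(x_i, y_i \mid \theta)

    \ell_{\text{cond}}(\theta) = \sum_{i=1}^{N} \log p(y_i \mid x_i, \theta)
                               = \sum_{i=1}^{N} \bigl[ \log p(x_i, y_i \mid \theta) - \log p(x_i \mid \theta) \bigr]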


IPF for Discrete Chain Factor Graphs

Wiegerinck, Wim, Heskes, Tom

arXiv.org Artificial Intelligence

Iterative Proportional Fitting (IPF), combined with EM, is commonly used as an algorithm for likelihood maximization in undirected graphical models. In this paper, we present two iterative algorithms that generalize IPF. The first one is for likelihood maximization in discrete chain factor graphs, which we define as a wide class of discrete variable models including undirected graphical models and Bayesian networks, but also chain graphs and sigmoid belief networks. The second one is for conditional likelihood maximization in standard undirected models and Bayesian networks. In both algorithms, the iteration steps are expressed in closed form. Numerical simulations show that the algorithms are competitive with state-of-the-art methods.
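
As background for this entry, classical IPF on a two-way contingency table alternates between rescaling rows and columns until the fitted table matches the target marginals. The Python sketch below illustrates only that baseline, with made-up names (ipf_two_way, seed, rows, cols); it is not the chain-factor-graph or conditional-likelihood algorithm introduced in the paper.

    import numpy as np

    def ipf_two_way(seed_table, row_targets, col_targets, n_iter=100, tol=1e-10):
        """Classical IPF: alternately rescale rows and columns of a 2-D table
        until its marginals match the given row/column targets."""
        fitted = np.asarray(seed_table, dtype=float).copy()
        for _ in range(n_iter):
            # Multiplicative update toward the target row marginals.
            row_sums = fitted.sum(axis=1, keepdims=True)
            fitted *= row_targets[:, None] / np.where(row_sums == 0.0, 1.0, row_sums)
            # Multiplicative update toward the target column marginals.
            col_sums = fitted.sum(axis=0, keepdims=True)
            fitted *= col_targets[None, :] / np.where(col_sums == 0.0, 1.0, col_sums)
            if (np.allclose(fitted.sum(axis=1), row_targets, atol=tol)
                    and np.allclose(fitted.sum(axis=0), col_targets, atol=tol)):
                break
        return fitted

    # Toy usage: spread a uniform 3x2 seed table onto fixed marginals.
    seed = np.ones((3, 2))
    rows = np.array([20.0, 30.0, 50.0])
    cols = np.array([60.0, 40.0])
    print(ipf_two_way(seed, rows, cols))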


Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm

Jebara, Tony, Pentland, Alex

Neural Information Processing Systems

Advantages in feature selection, robustness and limited resource allocation have been studied. Ultimately, tasks such as regression and classification reduce to the evaluation of a conditional density. However, popularity of maximum joint likelihood and EM techniques remains strong in part due to their elegance and convergence properties. Thus, many conditional problems are solved by first estimating joint models then conditioning them.
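
The "estimate a joint model, then condition it" route mentioned in this excerpt can be made concrete with a minimal sketch: fit a single joint Gaussian over (x, y) by maximum likelihood and read off p(y | x) from the standard conditional-Gaussian formulas. The names below (conditional_gaussian, x_query, the toy data) are illustrative assumptions; this is the joint-then-condition baseline the CEM entry contrasts with direct conditional optimization, not the CEM method itself.

    import numpy as np

    # Toy data where y depends linearly on x plus noise.
    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = 2.0 * x + rng.normal(scale=0.5, size=500)

    # Step 1: maximum JOINT likelihood fit of one 2-D Gaussian (closed form).
    mu = np.array([x.mean(), y.mean()])
    cov = np.cov(np.stack([x, y]), bias=True)  # ML covariance estimate

    # Step 2: condition the fitted joint model to obtain p(y | x).
    def conditional_gaussian(x_query, mu, cov):
        """Mean and variance of y given x under the fitted joint Gaussian."""
        mu_x, mu_y = mu
        s_xx, s_xy, s_yy = cov[0, 0], cov[0, 1], cov[1, 1]
        mean = mu_y + (s_xy / s_xx) * (x_query - mu_x)
        var = s_yy - s_xy ** 2 / s_xx
        return mean, var

    print(conditional_gaussian(1.0, mu, cov))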

